Conversation

ozzieba commented Jan 19, 2018

Adding tests in response to apache/spark#20272

ozzieba (Author) commented Jan 19, 2018

Whoops, looks like merging from master broke my function...

ozzieba (Author) commented Jan 19, 2018

Any way to get Jenkins to pull Spark from my branch?

</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>

Member:

Why do you need to depend on hadoop-common?

Author:

Without it I get java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration. In general, I believe Hive JDBC requires some Hadoop dependencies, as discussed in e.g. https://issues.apache.org/jira/browse/HIVE-15110. I'm not sure if it's only Configuration that's needed. Do you have a cleaner solution?

Member:

Ah, thanks for the info.

import java.util.UUID
import java.util.regex.Pattern
import java.sql.DriverManager
import org.apache.hive.jdbc.HiveDriver

Member:

There should be an empty line separating the java imports from the rest, and the order of import groups should be java, scala, other third-party, and finally spark. So org.apache.hive.jdbc.HiveDriver should be moved to after io.fabric8....
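
A sketch of the suggested layout; the io.fabric8 import shown is an assumption based on the Pod type used in the test signature below, and imports within each group are also sorted:

import java.sql.DriverManager
import java.util.UUID
import java.util.regex.Pattern

import io.fabric8.kubernetes.api.model.Pod  // assumed existing third-party import
import org.apache.hive.jdbc.HiveDriver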

}

private def runThriftServerAndVerifyQuery(
driverPodChecker: Pod => Unit = doBasicDriverPodCheck,

Member:

Indentation for function parameters is wrong. It should be four spaces.
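
For reference, the opening of the signature with four-space parameter indentation (only the first parameter appears in the excerpt; any remaining parameters would be indented the same way):

private def runThriftServerAndVerifyQuery(
    driverPodChecker: Pod => Unit = doBasicDriverPodCheck,
    // ... remaining parameters, also indented four spaces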

assert(resultSet.getInt(1) == 42)
} finally {
statement.close()
connection.close()

Member:

connection should be closed regardless of whether statement.close throws an exception.
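
One way to do that, sketched with the variable names from the excerpt above (nesting the close calls so connection.close() runs even if statement.close() fails):

try {
  assert(resultSet.getInt(1) == 42)
} finally {
  try {
    statement.close()
  } finally {
    connection.close()  // executed even if statement.close() throws
  }
}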

ozzieba (Author) commented Jan 19, 2018

@liyinan926 thanks for checking this out. I'm still somewhat new to Scala, so let me know if anything else needs to be fixed.

}

private def runThriftServerAndVerifyQuery(
driverPodChecker: Pod => Unit = doBasicDriverPodCheck,

Member:

These should be indented four spaces.

liyinan926 (Member) commented Jan 19, 2018

LGTM with a couple of minor comments. BTW: the added test failed in the last integration test run somehow.

ozzieba (Author) commented Jan 19, 2018

@liyinan926 Thanks. The test succeeds using my fork of the Spark repo, and presumably should be merged together with apache/spark#20272. Is there a way to tell Jenkins to test my fork before this?

liyinan926 (Member) commented Jan 19, 2018

dev/dev-run-integration-tests.sh has an option --spark-repo that allows you to specify a repo to test against. Were you using that to test against your fork? You can post your manual test result to apache/spark#20272.

ozzieba (Author) commented Jan 20, 2018

I ran it that way successfully on my own machine; I just wanted to make it "official" on Jenkins.

foxish (Member) commented Jan 21, 2018

@ozzieba, looks like the test is failing here. Can you PTAL?

ozzieba (Author) commented Jan 21, 2018

@foxish The failure in Jenkins is expected, since SparkSubmit before my PR explicitly blacklists STS (the Spark Thrift Server) from running in cluster mode.

For the record, here is the log of the tests running after my change
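
For context, the restriction mentioned above is a check in SparkSubmit along these lines (paraphrased, not an exact quote of the Spark source; apache/spark#20272 is what lifts it):

// Inside SparkSubmit's validation of the deploy mode (approximate shape):
case (_, CLUSTER) if isThriftServer(args.mainClass) =>
  error("Cluster deploy mode is currently not supported for Thrift server.")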

AmplabJenkins commented
Can one of the admins verify this patch?
